Continuous Steepest Descent Path for Traversing Non-convex Regions

Author

  • S. Beddiaf
Abstract

This paper revisits the idea of seeking unconstrained minima by following a continuous steepest descent path (CSDP). We are especially interested in the merits of such an approach in regions where the objective function is non-convex and Newton-like methods become ineffective. The paper combines ODE-trajectory following with trust-region ideas to give an algorithm which performs curvilinear searches on each iteration. Progress along the CSDP is governed both by the decrease in function value and by measures of the accuracy of a local quadratic model. Experience with a prototype implementation of the algorithm is promising, and it is shown to be competitive with more conventional line-search and trust-region approaches. In particular, it also performs well in comparison with the superficially similar gradient-flow method proposed by Behrman.
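The paper's own algorithm is not reproduced here; the sketch below only illustrates the general CSDP idea under simplifying assumptions: the steepest-descent ODE dx/dt = -grad f(x) is integrated with crude explicit Euler substeps, and the arc parameter is grown or shrunk according to how much of a first-order prediction of the decrease is achieved (a simplified stand-in for the quadratic-model test mentioned in the abstract). The names csdp_step, csdp_minimize, t0 and n_sub are hypothetical.

```python
import numpy as np

def csdp_step(grad_f, x, t_total, n_sub=20):
    """Follow the steepest-descent ODE dx/dt = -grad_f(x) from x for
    'time' t_total using explicit Euler substeps (a deliberately crude
    trajectory integrator, for illustration only)."""
    h = t_total / n_sub
    for _ in range(n_sub):
        x = x - h * grad_f(x)
    return x

def csdp_minimize(f, grad_f, x0, t0=1.0, tol=1e-6, max_iter=500):
    """Illustrative curvilinear-search loop: the arc parameter t is
    enlarged when the observed decrease agrees with a first-order
    prediction and shortened otherwise."""
    x, t = np.asarray(x0, dtype=float), t0
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x_trial = csdp_step(grad_f, x, t)
        predicted = -t * g.dot(g)              # first-order estimate of the decrease
        actual = f(x_trial) - f(x)
        if np.isfinite(actual) and actual <= 0.5 * predicted:
            x, t = x_trial, 2.0 * t            # accept the arc, try a longer one next
        else:
            t = 0.5 * t                        # model was poor: shorten the arc and retry
    return x

# Example: Rosenbrock's function, whose curved valley is non-convex.
f = lambda z: (1.0 - z[0])**2 + 100.0 * (z[1] - z[0]**2)**2
grad_f = lambda z: np.array([-2.0 * (1.0 - z[0]) - 400.0 * z[0] * (z[1] - z[0]**2),
                             200.0 * (z[1] - z[0]**2)])
print(csdp_minimize(f, grad_f, [-1.2, 1.0]))
```

A serious implementation would replace the Euler substeps with a higher-order or adaptive integrator and use the paper's trust-region-style control of progress along the path rather than the simple ratio test shown here.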


Related resources

Traversing non-convex regions

This paper considers a method for dealing with non-convex objective functions in optimization problems. It uses the Hessian matrix and combines features of trust-region techniques and continuous steepest descent trajectory-following in order to construct an algorithm which performs curvilinear searches away from the starting point of each iteration. A prototype implementation yields promising r...


Discrete L-/M-Convex Function Minimization Based on Continuous Relaxation (Mathematical Engineering Technical Reports)

We consider the problem of minimizing a nonlinear discrete function with L-/M-convexity, as proposed in the theory of discrete convex analysis. For this problem, steepest descent algorithms and steepest descent scaling algorithms are known. In this paper, we use a continuous relaxation approach which minimizes the continuous-variable version first in order to find a good initial solution for a steepes...
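As a rough illustration of the two-phase idea described above (a continuous relaxation supplying a starting point, followed by a discrete steepest descent), one might proceed as in the sketch below; it ignores the L-/M-convexity structure that the report actually exploits, and the helper relax_then_descend and the test function are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def relax_then_descend(f, x0):
    """Two-phase sketch: (1) minimize f over continuous variables,
    (2) round the relaxed solution to the nearest integer point and run a
    steepest descent over unit coordinate moves until no neighbour improves."""
    # Phase 1: continuous relaxation, solved with a generic smooth minimizer.
    relaxed = minimize(f, np.asarray(x0, dtype=float))
    x = np.rint(relaxed.x).astype(int)
    # Phase 2: discrete steepest descent over +/-1 moves along each axis.
    n = x.size
    while True:
        neighbours = [x + d * e for d in (-1, 1) for e in np.eye(n, dtype=int)]
        best = min(neighbours, key=f)
        if f(best) < f(x):
            x = best
        else:
            return x

# Usage: a separable quadratic whose continuous minimizer is non-integral.
f = lambda z: float(np.sum((np.asarray(z, dtype=float) - np.array([2.3, -1.7]))**2))
print(relax_then_descend(f, x0=[0.0, 0.0]))
```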


An Asymptotical Variational Principle Associated with the Steepest Descent Method for a Convex Function

The asymptotical limit of the trajectory defined by the continuous steepest descent method for a proper closed convex function f on a Hilbert space is characterized in the set of minimizers of f via an asymptotical variational principle of Brezis-Ekeland type. The implicit discrete analogue (prox method) is also considered.
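For orientation, the continuous steepest descent trajectory and its implicit discrete (prox) analogue referred to here are conventionally written as follows; this is the standard formulation, not an excerpt from the cited work.

```latex
% Continuous steepest descent for a proper closed convex f on a Hilbert space H:
\dot{x}(t) \in -\partial f\bigl(x(t)\bigr), \quad t > 0, \qquad x(0) = x_0 ,
% and its implicit discretization (the prox method) with step sizes \lambda_k > 0:
x_{k+1} = \operatorname*{arg\,min}_{y \in H} \Bigl\{ f(y) + \tfrac{1}{2\lambda_k}\,\lVert y - x_k\rVert^2 \Bigr\}.
```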


LANCS Workshop on Modelling and Solving Complex Optimisation Problems

Towards optimal Newton-type methods for nonconvex smooth optimization. Coralia Cartis (Coralia.Cartis (at) ed.ac.uk), School of Mathematics, Edinburgh University. We show that the steepest-descent and Newton methods for unconstrained non-convex optimization, under standard assumptions, may both require a number of iterations and function evaluations arbitrarily close to the steepest-descent's global...


Convergence of the Nelder-Mead Simplex Method to a Non-Stationary Point

This paper analyses the behaviour of the Nelder-Mead simplex method for a family of examples which cause the method to converge to a non-stationary point. All the examples use continuous functions of two variables. The family of functions contains strictly convex functions with up to three continuous derivatives. In all the examples the method repeatedly applies the inside contraction step with...



Journal:

Volume:   Issue:

Pages:  -

Publication year: 2008